    Europe's Space capabilities for the benefit of the Arctic

    In recent years, the Arctic region has acquired increasing environmental, social, economic and strategic importance. The Arctic’s fragile environment is both a direct and key indicator of climate change and requires specific mitigation and adaptation actions. The EU has a clear strategic interest in playing a key role in the region and is actively responding to the impacts of climate change by safeguarding the Arctic’s fragile ecosystem and ensuring sustainable development, particularly in the European part of the Arctic. The European Commission’s Joint Research Centre has recently completed a study aimed at identifying the capabilities of, and relevant synergies across, the four domains of the EU Space Programme: Earth observation, satellite navigation, satellite communications, and space situational awareness (SSA). These synergies are expected to be key enablers of new services with a high societal impact in the region, services that could be developed in a more cost-efficient and rapid manner. Synergies will also help exploit to their full extent the operational services already deployed in the Arctic (e.g., the Copernicus emergency service or the Galileo Search and Rescue service could greatly benefit from improved satellite communications connectivity in the region). JRC.E.2-Technology Innovation in Security

    Unifying European Biodiversity Informatics (BioUnify)

    In order to preserve the variety of life on Earth, we must understand it better. Biodiversity research is at a pivotal point, with research projects generating data at an ever-increasing rate. Structuring, aggregating, linking and processing these data in a meaningful way is a major challenge. The systematic application of information management and engineering technologies in the study of biodiversity (biodiversity informatics) helps transform data into knowledge. However, concerted action by existing e-infrastructures is required to develop and adopt common standards and provisions for interoperability, and to avoid overlapping functionality. This would unify the currently fragmented landscape that restricts European biodiversity research from reaching its full potential. The overarching goal of this COST Action is to coordinate existing research and capacity-building efforts, through a bottom-up, trans-disciplinary approach, by unifying biodiversity informatics communities across Europe in order to support the long-term vision of modelling biodiversity on Earth. BioUnify will: 1. specify technical requirements and evaluate and improve models for efficient data and workflow storage, sharing and re-use, within and between different biodiversity communities; 2. mobilise taxonomic, ecological, genomic and biomonitoring data generated and curated by natural history collections, research networks and remote sensing sources in Europe; 3. leverage the results of ongoing biodiversity informatics projects by identifying and developing functional synergies at the individual, group and project levels; 4. raise technical awareness and transfer skills between biodiversity researchers and information technologists; 5. formulate a viable roadmap for achieving the long-term goals of European biodiversity informatics, which ensures alignment with global activities and translates into efficient biodiversity policy.

    Publishing OGC resources discovered on the mainstream web in an SDI catalogue

    Nowadays, geospatial data users search for geospatial information within a Spatial Data Infrastructure (SDI) using the discovery clients of a geoportal application (e.g., the INSPIRE Geoportal). If data producers want to promote related resources and make them available in the SDI, they need to create metadata according to predefined rules (e.g., the INSPIRE metadata regulation) and publish them using the CSW standard. This approach allows either distributed searches across, or harvesting of metadata from, different SDI nodes. Nevertheless, many data producers still make their resources available on the web without documenting and publishing them in a standardised way. The paper describes a workflow, and a tool implementing it, to make OGC-based geospatial services found on the Internet discoverable through CSW-compatible service catalogues and, hence, more visible to the wider SDI community. JRC.H.6-Digital Earth and Reference Data
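
    To make the publishing step concrete, the following is a minimal Python sketch (not the tool described in the paper) that reads the GetCapabilities document of a discovered WMS and inserts a simple Dublin Core record into a CSW catalogue via a CSW-T Insert transaction. The endpoint URLs are placeholders and the requests library is assumed; a production workflow would add error handling, richer (e.g. ISO 19139) metadata and authentication.

# Illustrative sketch only: harvest basic metadata from a discovered WMS and
# publish it to a CSW catalogue with a CSW-T Insert. URLs are placeholders.
import requests
import xml.etree.ElementTree as ET
from xml.sax.saxutils import escape

WMS_URL = "https://example.org/wms"   # discovered OGC service (placeholder)
CSW_URL = "https://example.org/csw"   # SDI catalogue endpoint (placeholder)

# 1. Fetch and parse the capabilities document of the discovered service.
caps = requests.get(WMS_URL, params={"service": "WMS", "request": "GetCapabilities"})
root = ET.fromstring(caps.content)
ns = {"wms": "http://www.opengis.net/wms"}   # WMS 1.3.0 namespace
title = root.findtext("wms:Service/wms:Title", default="Untitled service", namespaces=ns)
abstract = root.findtext("wms:Service/wms:Abstract", default="", namespaces=ns)

# 2. Build a CSW Transaction (Insert) carrying a simple csw:Record.
record = f"""<csw:Transaction service="CSW" version="2.0.2"
    xmlns:csw="http://www.opengis.net/cat/csw/2.0.2"
    xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns:dct="http://purl.org/dc/terms/">
  <csw:Insert>
    <csw:Record>
      <dc:identifier>{escape(WMS_URL)}</dc:identifier>
      <dc:title>{escape(title)}</dc:title>
      <dct:abstract>{escape(abstract)}</dct:abstract>
      <dc:type>service</dc:type>
      <dc:format>OGC:WMS</dc:format>
    </csw:Record>
  </csw:Insert>
</csw:Transaction>"""

# 3. Publish the record so the service becomes discoverable in the catalogue.
resp = requests.post(CSW_URL, data=record.encode("utf-8"),
                     headers={"Content-Type": "application/xml"})
print(resp.status_code)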

    Discovery of Geospatial Information Resources on the Web

    Nowadays, the web provides different methods for the discovery and retrieval of geospatial information (GI) resources. For an ordinary internet user, the most commonly used information systems for retrieving GI resources are so-called spatial browsing systems such as Google Maps. A Spatial Data Infrastructure (SDI) provides more detailed information on specific spatial data themes, with advanced search facilities based on detailed documentation, i.e. metadata. Current research has reported that the so-called mainstream web provides many valuable GI resources of several types (e.g. OGC services, KML data, etc.), which can be discovered using web search engines (SEs) such as Google, Yahoo or Bing. The paper describes the individual steps of a workflow for automatic metadata extraction from the information provided by OGC web services, together with further extensions to fulfil the requirements of the INSPIRE legislation and the related standardisation for digital geographic information. Additionally, a workflow for searching GI resources provided by OGC services on the mainstream web, represented by Google search engines, is elucidated. JRC.H.6-Digital Earth and Reference Data
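
    As an illustration of the metadata extraction step, the sketch below (not the paper's implementation) pulls a few INSPIRE-style discovery fields out of a WMS 1.3.0 GetCapabilities response. The element paths, field names and placeholder endpoint are assumptions made for the example; the requests library is assumed to be available.

# Illustrative sketch only: map elements of a WMS 1.3.0 GetCapabilities
# response to a flat record resembling core INSPIRE discovery fields.
import requests
import xml.etree.ElementTree as ET

NS = {"wms": "http://www.opengis.net/wms"}   # WMS 1.3.0 namespace

def extract_metadata(capabilities_url):
    xml = requests.get(capabilities_url,
                       params={"service": "WMS", "request": "GetCapabilities"}).content
    root = ET.fromstring(xml)
    svc = root.find("wms:Service", NS)       # error handling omitted for brevity
    return {
        "resource_title": svc.findtext("wms:Title", default="", namespaces=NS),
        "resource_abstract": svc.findtext("wms:Abstract", default="", namespaces=NS),
        "keywords": [k.text for k in svc.findall("wms:KeywordList/wms:Keyword", NS) if k.text],
        "responsible_party": svc.findtext(
            "wms:ContactInformation/wms:ContactPersonPrimary/wms:ContactOrganization",
            default="", namespaces=NS),
        "resource_locator": capabilities_url,
        "layers": [l.text for l in root.findall(".//wms:Layer/wms:Name", NS) if l.text],
    }

# Example use with a placeholder endpoint found through a web search engine:
# print(extract_metadata("https://example.org/wms"))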

    Parallel Processing Strategies for Geospatial Data in a Cloud Computing Infrastructure

    This paper addresses the optimization of computing resources for processing geospatial image data in a cloud computing infrastructure. Parallelization was tested by combining two different strategies: image tiling and multi-threading. The objective was to gain insight into the optimal use of available processing resources in order to minimize processing time. Maximum speedup was obtained when combining the tiling and multi-threading techniques. The two techniques are complementary, but a trade-off also exists. Tiling improves speedup, as parts of the image can be processed in parallel, but reading part of the image introduces an overhead and increases the relative share of the program that can only run serially, which limits the speedup that can be achieved via multi-threading. The optimal combination of tiling and multi-threading that maximizes speedup depends on the scale of the application (global or local processing area), the implementation of the algorithm (processing libraries), and the available computing resources (amount of memory and number of cores). A medium-sized virtual server obtained from a cloud service provider has rather limited computing resources; there, tiling not only improves speedup but can be necessary to reduce the memory footprint. However, a tiling scheme with many small tiles increases overhead and can introduce extra latency due to queued tiles waiting to be processed. In a high-throughput computing cluster with hundreds of physical processing cores, more tiles can be processed in parallel and the optimal strategy will be different. A quantitative assessment of the speedup was performed in this study, based on a number of experiments in different computing environments, thereby assessing the potential and limitations of parallel processing by tiling and multi-threading. The experiments were based on an implementation that relies on an application programming interface (API) abstracting any platform-specific details, such as those related to data access.
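
    The following toy sketch (not the paper's API or implementation) illustrates the combination of the two strategies on an in-memory array: the image is split into square tiles and each tile is filtered by a worker thread. Tile size, worker count and the smoothing kernel are arbitrary choices; NumPy and SciPy are assumed.

# Toy illustration of tiling + multi-threading on an in-memory image.
import time
from concurrent.futures import ThreadPoolExecutor

import numpy as np
from scipy.ndimage import uniform_filter

def tiles(shape, size):
    """Yield (row slice, column slice) windows covering the whole image."""
    rows, cols = shape
    for r in range(0, rows, size):
        for c in range(0, cols, size):
            yield slice(r, min(r + size, rows)), slice(c, min(c + size, cols))

def process(image, tile_size=1024, workers=4):
    out = np.empty_like(image)

    def work(window):
        rs, cs = window
        # Tiles are disjoint, so each thread writes its own part of the output.
        # ndimage filters typically release the GIL, so threads can overlap.
        # Tile borders are ignored here; a real implementation would add overlap.
        out[rs, cs] = uniform_filter(image[rs, cs], size=5)

    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(work, tiles(image.shape, tile_size)))
    return out

if __name__ == "__main__":
    img = np.random.rand(4096, 4096).astype(np.float32)
    for n in (1, 2, 4):
        t0 = time.perf_counter()
        process(img, workers=n)
        print(f"{n} worker(s): {time.perf_counter() - t0:.2f} s")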

    Investigating the state of physiologically based kinetic modelling practices and challenges associated with gaining regulatory acceptance of model applications

    Physiologically based kinetic (PBK) models are widely used across a number of sectors, including academia and industry, to provide insight into the dosimetry related to observed adverse health effects in humans and other species. The use of these models has increased over the last several decades, especially in conjunction with emerging alternatives to animal testing, such as in vitro studies and data-driven in silico quantitative structure-activity relationship (QSAR) predictions. Experimental information derived from these new approach methods can be used as input for model parameters and allows for increased confidence in models for chemicals that lack in vivo data for model calibration. Despite significant advancements in good modelling practice (GMP) for model development and evaluation, there remains some reluctance among regulatory agencies to use such models during the risk assessment process. Here, the results of a survey disseminated to the modelling community are presented in order to document the frequency of use and the applications of PBK models in science and in regulatory submissions. Additionally, the survey was designed to identify a network of investigators involved in PBK modelling and knowledgeable of GMP, who might be contacted in the future for peer review of PBK models, especially with regard to vetting the models to such a degree as to gain greater acceptance for regulatory purposes. JRC.F.3-Chemicals Safety and Alternative Methods
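
    For readers unfamiliar with the model class, the toy sketch below shows the general form of a deliberately minimal compartmental kinetic model solved as an ODE system. It is purely illustrative: the structure and all parameter values are invented and do not correspond to any chemical or to any model covered by the survey; NumPy and SciPy are assumed.

# Toy example only: a minimal two-compartment kinetic model (gut absorption
# into a central compartment with first-order clearance). All values invented.
import numpy as np
from scipy.integrate import solve_ivp

ka = 1.0      # absorption rate constant (1/h), assumed
ke = 0.2      # elimination rate constant (1/h), assumed
V = 40.0      # apparent volume of distribution (L), assumed
dose = 100.0  # oral dose (mg), assumed

def rhs(t, y):
    a_gut, a_body = y                  # amounts in gut and body (mg)
    return [-ka * a_gut,               # absorption out of the gut
            ka * a_gut - ke * a_body]  # uptake into, and clearance from, the body

sol = solve_ivp(rhs, (0.0, 24.0), [dose, 0.0], t_eval=np.linspace(0, 24, 97))
conc = sol.y[1] / V                    # central-compartment concentration (mg/L)
print(f"Cmax ~ {conc.max():.2f} mg/L at t ~ {sol.t[conc.argmax()]:.1f} h")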

    Capturing the applicability of in vitro-in silico membrane transporter data in chemical risk assessment and biomedical research

    Costs, as well as scientific and ethical concerns, related to animal tests for regulatory decision-making have stimulated the development of alternative methods. When applying alternative approaches, kinetics have been identified as a key element to consider. Membrane transporters affect the kinetic processes of absorption, distribution, metabolism and excretion (ADME) of various compounds, such as drugs or environmental chemicals. Pharmaceutical scientists have therefore intensively studied transporters that affect drug efficacy and safety. Beyond pharmacokinetics, transporters are considered a major determinant of toxicokinetics, potentially representing an essential piece of information in chemical risk assessment. To capture the applicability of transporter data for kinetics-based risk assessment in non-pharmaceutical sectors, the EU Reference Laboratory for Alternatives to Animal Testing (EURL ECVAM) created a survey with a view to identifying the improvements needed when using in vitro and in silico methods. Seventy-three participants, from different sectors and with various kinds of expertise, completed the survey. The results revealed that transporters are investigated mainly during drug development, but also for risk assessment of food and feed contaminants, industrial chemicals, cosmetics and nanomaterials, and in the context of environmental toxicology, using both in vitro and in silico tools. However, to rely solely on alternative methods for chemical risk assessment, it is critical that the data generated by in vitro and in silico methods possess scientific integrity, are reproducible and are of high quality, so that they are trusted by decision makers and used by industry. Accordingly, the respondents identified various challenges related to the interpretation and use of transporter data from non-animal methods. Overall, it was determined that a combined, mechanistically anchored in vitro-in silico approach, validated against available human data, would build confidence in using transporter data within an animal-free risk assessment paradigm. Finally, respondents involved primarily in fundamental research expressed lower confidence in non-animal studies to unravel complex transporter mechanisms. JRC.F.3-Chemicals Safety and Alternative Methods
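
    To illustrate why transporters matter for kinetics, the toy sketch below contrasts passive diffusion with saturable, carrier-mediated uptake into a cell compartment, the latter described by a Michaelis-Menten term. The model structure and all parameter values are invented for illustration and are unrelated to the survey or to any specific transporter; NumPy and SciPy are assumed.

# Toy example only: passive diffusion vs. saturable transporter-mediated uptake
# into a cell compartment. All parameter values are invented.
import numpy as np
from scipy.integrate import solve_ivp

P_diff = 0.05  # passive diffusion clearance (mL/min), assumed
Vmax = 2.0     # maximal transport rate (nmol/min), assumed
Km = 5.0       # Michaelis constant (uM), assumed
V_cell = 1.0   # cell compartment volume (mL), assumed
C_out = 10.0   # constant external concentration (uM), assumed

def uptake(t, y):
    c_in = y[0]
    passive = P_diff * (C_out - c_in)      # linear in the concentration gradient
    carrier = Vmax * C_out / (Km + C_out)  # saturable (Michaelis-Menten) influx
    return [(passive + carrier) / V_cell]

sol = solve_ivp(uptake, (0.0, 60.0), [0.0], t_eval=np.linspace(0, 60, 61))
# The carrier term lets the cell accumulate above the external concentration
# until passive back-diffusion balances the transporter influx.
print(f"intracellular concentration after 60 min: {sol.y[0][-1]:.1f} uM")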

    A Surveillance System for Enhancing the Safety of Rescue Teams

    The article summarizes preliminary results of the research and development of a system focused on enhancing the safety of teams participating in the integrated rescue system when managing extraordinary events or crisis situations (fire, mass disaster, release of harmful industrial substances), and on supporting their training. Individual partial technical solutions are described, which should lead to automated telemetric monitoring equipment in a more resistant form. The equipment should make it possible to recognize the nature and intensity of motion, including the determination of current and total energy output, to monitor environmental parameters (temperature, smoke, etc.), to support real-time back analysis of the course of an intervention or training session, and to monitor health and physiological parameters and signal risk conditions (physical exhaustion, stress, overheating, etc.) under extreme conditions.
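
    As a sketch of the kind of monitoring described (not the system's actual design), the example below derives a crude motion-intensity proxy from accelerometer samples and flags risk conditions from environmental and physiological readings. The data model, field names and thresholds are all invented for illustration; Python 3.9+ is assumed.

# Illustrative sketch only: invented telemetry fields and alert thresholds.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Sample:
    accel: tuple[float, float, float]  # acceleration (g) on x, y, z
    temperature_c: float               # ambient temperature (deg C)
    heart_rate_bpm: int                # heart rate (beats per minute)

def motion_intensity(samples):
    """Mean deviation of acceleration magnitude from 1 g (a crude activity proxy)."""
    mags = [abs((x * x + y * y + z * z) ** 0.5 - 1.0)
            for x, y, z in (s.accel for s in samples)]
    return mean(mags)

def alerts(samples):
    out = []
    if motion_intensity(samples) < 0.02:
        out.append("no motion detected: possible incapacitation")
    if max(s.temperature_c for s in samples) > 60.0:
        out.append("ambient temperature above threshold")
    if mean(s.heart_rate_bpm for s in samples) > 180:
        out.append("sustained high heart rate: possible exhaustion")
    return out

# Example: a stationary wearer in a cool environment triggers the motion alert.
window = [Sample((0.02, -0.01, 1.0), 24.0, 92) for _ in range(50)]
print(alerts(window))   # -> ['no motion detected: possible incapacitation']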